The penalty follows an investigation that found MediaLab allowed children to use Imgur without putting in place the basic safeguards required under UK data protection law.
We concluded MediaLab breached the law by:
- Failing to implement any measures to check the age of users.
- Processing the personal information of children under 13 without parental consent or any other lawful basis when offering online services.
- Failing to carry out a data protection impact assessment to identify and reduce privacy risks to children.
Personal information often drives the content children see online. MediaLab had no way of knowing the age of Imgur users, meaning that children were at risk of being exposed to harmful content on the platform, including content related to eating disorders, homophobia, antisemitism and images of a sexual or violent nature.
John Edwards, UK Information Commissioner, said:
“MediaLab failed in its legal duties to protect children, putting them at unnecessary risk. For years, it allowed children to use Imgur without any effective age checks, while collecting and processing their data, which in turn exposed them to harmful and inappropriate content.
“Age checks help organisations keep children's personal information safe and not used in ways that may harm them, such as by recommending age-inappropriate content.
“This fine is part of our wider work to drive improvements in how digital platforms use children’s personal data. Ignoring the fact that children use these services, while processing their data unlawfully, is not acceptable. Companies that choose to ignore this can expect to face similar enforcement action.”
Investigation findings and enforcement action
Our investigation found that between September 2021 and September 2025, MediaLab processed the personal information of children using Imgur in ways that breached the UK GDPR.
UK law says that online services using the personal information of children under 13 can only rely on the lawful basis of consent if consent is given by the child’s parent or carer.
Imgur’s terms stated that children under 13 could only use the platform with parental supervision. However, MediaLab did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform.
Under the law, we can issue fines of up to £17.5 million or 4% of an organisation’s annual worldwide turnover, whichever is higher.
In setting the £247,590 penalty amount, we took into consideration the number of children affected by this breach, the degree of potential harm caused, the duration of the contraventions, and the company’s global turnover. We also considered MediaLab’s acceptance of our provisional findings set out in the Notice of Intent issued in September 2025 and its commitment to address the infringements if access to the Imgur platform in the UK is restored in the future. If MediaLab resumes processing the personal data of children in the UK without implementing the measures it has committed to, we may take further regulatory action.
We are considering the redaction of personal and commercially confidential or sensitive information ahead of publishing the monetary penalty notice.
Our role and remit in protecting children online
We are the UK’s independent regulator for data protection, and safeguarding children’s privacy online is a priority.
UK data protection law says children should be given special treatment when it comes to their personal information. Our Children’s code (also known as the Age Appropriate Design Code) translates the legal requirements into design standards for online services likely to be accessed by under-18s, helping organisations understand what is expected of them. That includes putting children’s best interests at the forefront and giving them a high level of privacy by default.
In December 2025, we reported strong progress on our Children’s code strategy, including a proactive supervision programme to drive improvements in how social media and video sharing platforms handle children’s data.
Age assurance advice for online services
Age assurance tools act as a guardrail, preventing children from accessing online services they shouldn’t be using or helping platforms tailor children’s online experience accordingly.
These tools can form part of a proportionate approach to reducing the data risks children face online and supporting conformance with the Children’s code.
Organisations should match the age assurance method they use to the level of risk on their platform, so they can tailor an age-appropriate experience. They can either apply the full protections of the Children’s code to all users or use proportionate age assurance tools to tailor safeguards by age.
Where children under a certain age are not allowed to use a service, organisations must focus on preventing access and enforce their minimum age requirements using robust age assurance methods.
Further guidance is available in our age assurance opinion.
Notes to editors
- The Information Commissioner’s Office (ICO) is the UK’s independent regulator that exists to empower people through their information rights. The ICO regulates organisations across the whole economy, including government and the public sector.
- The ICO has specific responsibilities set out in the Data Protection Act 2018, the United Kingdom General Data Protection Regulation, the Freedom of Information Act 2000, Environmental Information Regulations 2004, Privacy and Electronic Communications Regulations 2003 and a further five acts and regulations.
- Civil monetary penalties (CMP) are paid directly into the Consolidated Fund. From 1 April 2022, HM Treasury has allowed the ICO to keep some funds to cover certain pre-agreed costs up to a maximum cap of £7.5m per financial year. The approach is explained in our Annual Report and Accounts and is externally audited by the National Audit Office.
- The ICO can take action to address and change the behaviour of organisations and individuals that collect, use and keep personal information. This includes criminal prosecution, non-criminal enforcement and audit.
- To report a concern to the ICO, telephone our helpline on 0303 123 1113 or go to ico.org.uk/concerns.